non-Cartesian is a 3-D structural audio-visual world: an embodied, interactive 3-D environment that can be transformed and navigated through full-body-movement interaction, with dual control of interactive sound and 3-D navigation. It was designed for data-live 'embodied interactive 3-D environments' and for screen-based quasi-VR. The work is a series of explorations in embodied movement perception in interactive 3-D audio-visual installations, drawing on multi-disciplinary research in embodied cognition, a phenomenological perspective on body sensory perception, body-movement perception, and human-computer interaction (HCI).

Proprioception is the sense of how your limbs are oriented in space (a sense which may be mistaken); it also provides information about movement, derived from muscular, tendon, and articular sources. In VR glasses and head-mounted displays we depend on proprioception, and in practice proprioceptive disruption and mutation mean that body-memory can be transformed in VR as the body relearns and stores new body-memory. Proprioception is active all the time, but it becomes more predominant in VR and data-live environments.
Kinect output data for all 22 body parts (2015).
Java and Max MSP: output and grouping of body-movement data to 3-D navigation control and sound control, for free-style digital-analogue synth interaction.
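As a minimal sketch of this Java-to-Max routing, the fragment below hand-encodes one joint position as an OSC message over UDP, following the OSC 1.0 packet layout (null-terminated, 4-byte-padded strings, then big-endian 32-bit floats). The address `/skeleton/head` and the port 7400 are illustrative assumptions, not the installation's actual patch names.

```java
import java.nio.ByteBuffer;

// Minimal OSC message encoder: address string + three float arguments.
// OSC strings are null-terminated and padded to a 4-byte boundary;
// float arguments are 32-bit big-endian (per the OSC 1.0 spec).
public class OscJointSender {

    static byte[] paddedString(String s) {
        int len = s.length() + 1;            // include the null terminator
        int padded = (len + 3) & ~3;         // round up to a multiple of 4
        byte[] out = new byte[padded];
        byte[] raw = s.getBytes(java.nio.charset.StandardCharsets.US_ASCII);
        System.arraycopy(raw, 0, out, 0, raw.length);
        return out;                          // remaining bytes are already 0
    }

    // Pack one joint's x, y, z into an OSC message, e.g. "/skeleton/head".
    public static byte[] encode(String address, float x, float y, float z) {
        byte[] addr = paddedString(address);
        byte[] tags = paddedString(",fff");  // type tags: three floats
        ByteBuffer buf = ByteBuffer.allocate(addr.length + tags.length + 12);
        buf.put(addr).put(tags).putFloat(x).putFloat(y).putFloat(z);
        return buf.array();
    }

    public static void main(String[] args) throws Exception {
        byte[] msg = encode("/skeleton/head", 0.1f, 1.6f, 2.3f);
        // Send to a Max [udpreceive] object; port 7400 is an assumption.
        try (java.net.DatagramSocket sock = new java.net.DatagramSocket()) {
            sock.send(new java.net.DatagramPacket(
                msg, msg.length,
                java.net.InetAddress.getLoopbackAddress(), 7400));
        }
        System.out.println(msg.length);
    }
}
```

On the Max side, a `[udpreceive 7400]` feeding `[route /skeleton/head]` would unpack the three floats for sound and navigation control.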

EMBODIED INTERACTIVE 3-D AUDIO-VISUALS

EMBODIED INTERACTIVE 3-D AUDIO-VISUAL created for surround projection space or VR

Several non-Cartesian interactive designs have been tested with users. The body externalizes and internalizes space through body perception of the interactive 3-D motion graphics and surround sound in the digital virtual environment. The interactive virtual environment affects users in such a way as to make them feel the need for reorientation, and the next searching, apprehending movement is an extension of body-sense in virtual space. These searching, apprehending interactive body movements create and find their own expression with the virtual 3-D objects in motion and sound. The interactive sound was designed to reciprocally affect the body's moves and gestures in interaction with the virtual 3-D objects. The relation of the sound to the 3-D motion graphics was designed to strengthen the visual impact with corresponding sound, but also to build in sensitive tonal disparities to harness sensitivity in movement. The user could explore deeper aesthetic aspects of the motion graphics and sound, which brought normally unconscious proprioceptive movement into conscious perception by transforming body proprioception, for an enacted aesthetic (Xenakis, 2015).

The Unity games development engine can process data much faster than Max 7, which was used in the experimental practice. Its programming environment can handle the large amounts of live data received from Kinect sensors, and it has better motion-graphics and world-scene development tools; 3-D structures are imported into Unity and configured in x, y, z space. I used OpenNI source code to receive x, y, z position and rotation body-motion data from the Kinect, translating it into interactive 3-D modulation.
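The translation step above can be sketched as a simple normalisation: OpenNI reports joint positions in millimetres relative to the sensor, and scene control wants values in a fixed 0..1 range. The working ranges below (roughly ±1.5 m laterally, 0.8–4 m depth) are assumptions for illustration, not values from the installation.

```java
// Sketch: map an OpenNI joint position (millimetres, sensor-relative)
// into normalised 0..1 scene coordinates for 3-D navigation/modulation.
// The working ranges are assumptions, not the installation's settings.
public class JointMapper {

    static float clamp01(float v) {
        return Math.max(0f, Math.min(1f, v));
    }

    /** Returns {x, y, z}, each normalised to 0..1 for scene control. */
    public static float[] normalise(float xMm, float yMm, float zMm) {
        float nx = clamp01((xMm + 1500f) / 3000f); // left .. right
        float ny = clamp01((yMm + 1000f) / 2000f); // low .. high
        float nz = clamp01((zMm - 800f) / 3200f);  // near .. far
        return new float[] { nx, ny, nz };
    }
}
```

A user standing centred in front of the sensor at the near edge of the tracking range would map to roughly (0.5, 0.5, 0.0), ready to drive camera rotation or object modulation.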



'Embodied Interaction': Interactive 3-D Audio-visual installations. Globe Gallery, 2014.

  • Body-movement-modulated sound. Full-body-movement data from the skeleton is sent to a custom subtractor synth via OSC and Max patches. Presentation view showing the console of the custom interactive subtractor synth and interactive surround sound.

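One way such a body-to-synth mapping can work is sketched below: a normalised body parameter (say, hand height in the 0..1 range) drives a subtractive-synth filter cutoff on an exponential curve, which matches how pitch and filter sweeps are perceived. The 80 Hz to 8 kHz range is an assumption for illustration, not the synth's actual setting.

```java
// Sketch: map a normalised body parameter (0..1, e.g. hand height)
// to a subtractive-synth filter cutoff in Hz. An exponential curve
// gives perceptually even sweeps; the 80 Hz..8 kHz range is assumed.
public class CutoffMap {
    public static double cutoffHz(double norm) {
        double n = Math.max(0.0, Math.min(1.0, norm)); // clamp input
        return 80.0 * Math.pow(8000.0 / 80.0, n);      // 80 Hz at 0, 8 kHz at 1
    }
}
```

The resulting value would be sent as one more OSC float to the Max patch controlling the filter.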